Supplementary Materials: Trading Computation for Communication: Distributed Stochastic Dual Coordinate Ascent
Abstract
For the proof of Theorem 1, we first prove the following lemma.

Lemma 1. Assume that $\phi_i^*(z)$ is a $\gamma$-strongly convex function (where $\gamma$ can be zero). Then for any $t > 0$ and $s \in [0, 1]$, we have
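The inequality itself is not reproduced in this excerpt. For reference, the $\gamma$-strong convexity the lemma assumes is the standard notion (this restatement is ours, added for readability):

```latex
% g is \gamma-strongly convex iff, for all u, v and all s \in [0, 1],
g\bigl(s u + (1 - s) v\bigr)
  \;\le\; s\, g(u) + (1 - s)\, g(v)
  \;-\; \frac{\gamma}{2}\, s (1 - s)\, \lVert u - v \rVert^{2}.
```

With $\gamma = 0$ this reduces to ordinary convexity, which is why the lemma can allow $\gamma$ to be zero.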
Similar Resources
Trading Computation for Communication: Distributed Stochastic Dual Coordinate Ascent
We present and study a distributed optimization algorithm based on a stochastic dual coordinate ascent method. Stochastic dual coordinate ascent methods enjoy strong theoretical guarantees and often perform better than stochastic gradient descent methods for regularized loss minimization problems, yet little effort has been devoted to studying them in a distributed framework. We ...
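To make the building block concrete, here is a minimal single-machine SDCA sketch for L2-regularized squared loss. This is our illustration under standard assumptions, not the paper's distributed algorithm; the name `sdca_ridge` is ours, while the closed-form coordinate step is the usual one for this loss.

```python
import numpy as np

def sdca_ridge(X, y, lam, n_epochs=10, seed=0):
    """Vanilla SDCA for min_w (1/n) sum_i 0.5*(x_i^T w - y_i)^2 + (lam/2)*||w||^2.

    Maintains the primal-dual relation w = (1/(lam*n)) * sum_i alpha_i * x_i.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    alpha = np.zeros(n)              # one dual variable per example
    w = np.zeros(X.shape[1])         # primal iterate, kept in sync with alpha
    sq_norms = (X ** 2).sum(axis=1)
    for _ in range(n_epochs):
        for i in rng.permutation(n):
            # Closed-form maximizer of the dual along coordinate i
            # (specific to the squared loss).
            delta = (y[i] - X[i] @ w - alpha[i]) / (1.0 + sq_norms[i] / (lam * n))
            alpha[i] += delta
            w += (delta / (lam * n)) * X[i]
    return w, alpha

# Usage on synthetic data: w should approach the generating weights.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.01 * rng.normal(size=200)
w, _ = sdca_ridge(X, y, lam=1e-3, n_epochs=50)
print(np.round(w - w_true, 2))
```

Each coordinate step maximizes the dual objective exactly in one variable, which is what gives SDCA its clean convergence guarantees relative to a fixed-step SGD update.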
Distributed Asynchronous Dual-Free Stochastic Dual Coordinate Ascent
In this paper, we propose a new Distributed Asynchronous Dual-Free Coordinate Ascent method (dis-dfSDCA) and prove that it has a linear convergence rate in the convex case. Stochastic Dual Coordinate Ascent (SDCA) is a popular method for solving regularized convex loss minimization problems. The Dual-Free Stochastic Dual Coordinate Ascent (dfSDCA) method is a variation of SDCA and can be applied to a mo...
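The serial dual-free update that such methods distribute can be sketched as follows. This is our reconstruction of the generic dfSDCA step (pseudo-dual vectors alpha_i with w = (1/(lam*n)) * sum_i alpha_i), not the dis-dfSDCA algorithm itself; `dfsdca` and its signature are hypothetical.

```python
import numpy as np

def dfsdca(grad_i, n, dim, lam, eta, n_iters=10000, seed=0):
    """One-machine dual-free SDCA: no conjugate functions needed.

    grad_i(i, w) returns the gradient of the i-th loss at w. The invariant
    w = (1/(lam*n)) * sum_i alpha_i is preserved by every update.
    """
    rng = np.random.default_rng(seed)
    alpha = np.zeros((n, dim))       # pseudo-dual vectors, one per example
    w = np.zeros(dim)
    for _ in range(n_iters):
        i = rng.integers(n)
        # g is an unbiased estimate of the full regularized gradient at w,
        # and its variance vanishes at the optimum (where alpha_i = -grad_i).
        g = grad_i(i, w) + alpha[i]
        alpha[i] -= eta * lam * n * g
        w -= eta * g                 # matches the induced change alpha_i/(lam*n)
    return w

# e.g. least squares: dfsdca(lambda i, w: (X[i] @ w - y[i]) * X[i], ...)
```

Because the update never touches the conjugate loss, it remains well defined in settings where the dual is unavailable, which is one reason dfSDCA applies more generally than vanilla SDCA.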
Network Constrained Distributed Dual Coordinate Ascent for Machine Learning
With the explosion of data sizes and the limited storage space at any single location, data are often distributed across different locations. We thus face the challenge of performing large-scale machine learning on these distributed data over communication networks. In this paper, we study how network communication constraints impact the convergence speed of distributed machine learning optimizat...
Stochastic Dual Coordinate Ascent with Adaptive Probabilities: Supplementary material
Proofs. We shall need the following inequality.
Communication-Efficient Distributed Dual Coordinate Ascent
Communication remains the most significant bottleneck in the performance of distributed optimization algorithms for large-scale machine learning. In this paper, we propose a communication-efficient framework, COCOA, that uses local computation in a primal-dual setting to dramatically reduce the amount of necessary communication. We provide a strong convergence rate analysis for this class of al...
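The local-computation idea can be sketched as a single communication round. This reconstruction is ours (averaging aggregation, squared-loss SDCA as the local solver); `cocoa_round`, the partitioning into `blocks`, and the parameter names are illustrative assumptions, not the paper's code.

```python
import numpy as np

def cocoa_round(X, y, lam, alpha, w, blocks, local_iters, rng):
    """One COCOA-style round: each machine runs `local_iters` SDCA steps on
    its own block of dual coordinates against a frozen copy of w, and the
    driver then averages the K returned updates.
    """
    n = X.shape[0]
    K = len(blocks)
    dalpha = np.zeros_like(alpha)
    dw = np.zeros_like(w)
    for block in blocks:                      # these loops run in parallel
        a_loc, w_loc = alpha.copy(), w.copy()
        for _ in range(local_iters):
            i = rng.choice(block)
            d = (y[i] - X[i] @ w_loc - a_loc[i]) / (1 + X[i] @ X[i] / (lam * n))
            a_loc[i] += d
            w_loc += (d / (lam * n)) * X[i]
        dalpha += a_loc - alpha
        dw += w_loc - w
    # Averaging the K local updates keeps the combined step safe; only the
    # d-dimensional dw of each machine crosses the network per round.
    return alpha + dalpha / K, w + dw / K
```

Communication per round is one primal vector per machine, independent of `local_iters`, which is exactly the computation-for-communication trade such frameworks exploit.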
Publication date: 2013